An Enhanced Neural Word Embedding Model for Transfer Learning
Authors
Abstract
Due to the expansion of data generation, more and more natural language processing (NLP) tasks need to be solved. For this, word representation plays a vital role. Computation-based word embedding in various high-resource languages is very useful. However, until now, low-resource languages such as Bangla have had limited resources available in terms of models, toolkits, and datasets. Considering this fact, in this paper, an enhanced BanglaFastText word embedding model is developed using Python, along with two large pre-trained models based on FastText (Skip-gram and CBOW). These models were trained on a collected Bangla corpus (around 20 million points of text data, in which every paragraph is considered a data point). BanglaFastText outperformed Facebook's model by a significant margin. To evaluate and analyze the performance of these pre-trained models, the proposed work accomplished text classification based on three popular textual Bangla datasets, using classical machine learning approaches as well as a deep neural network. The evaluations showed superior performance over existing techniques and the Facebook model for Bangla NLP. In addition, the original work concerning these datasets provides excellent results. A toolkit is proposed that is convenient for accessing the embeddings and obtaining semantic relationships word-by-word or sentence-by-sentence; sentence embedding approaches are also included, as well as unsupervised fine-tuning on any Bangla linguistic dataset.
Similar Resources
Category Enhanced Word Embedding
Distributed word representations have been demonstrated to be effective in capturing semantic and syntactic regularities. Unsupervised representation learning from large unlabeled corpora can learn similar representations for those words that present similar cooccurrence statistics. Besides local occurrence statistics, global topical information is also important knowledge that may help discrim...
Bayesian Neural Word Embedding
Recently, several works in the domain of natural language processing presented successful methods for word embedding. Among them, the Skip-Gram (SG) with negative sampling, known also as word2vec, advanced the state-of-the-art of various linguistics tasks. In this paper, we propose a scalable Bayesian neural word embedding algorithm that can be beneficial to general item similarity tasks as well...
Morpheme-Enhanced Spectral Word Embedding
Traditional word embedding models only learn word-level semantic information from corpus while neglect the valuable semantic information of words’ internal structures such as morphemes. To address this problem, the goal of this paper is to exploit the morphological information to enhance the quality of word embeddings. Based on spectral method, we propose two word embedding models: Morpheme on ...
A model for enhanced heat transfer in an enclosure using Nano-aerosols
In this study, the behavior of nanoparticles is discussed using a numerical model. A model for enhanced free and mixed convection heat transfer in a rectangular enclosure with dimensions of 1 × 4 cm using nano-aerosols in air is developed; copper nanoparticles are used, and by changing the temperature difference between the hot and cold walls, we examine their impact on the ra...
Label Embedding for Transfer Learning
Automatically tagging textual mentions with the concepts, types and entities that they represent is an important task for which supervised learning has been found to be very effective. In this paper, we consider the problem of exploiting multiple sources of training data with variant ontologies. We present a new transfer learning approach based on embedding multiple label sets in a shared space,...
Journal
Journal title: Applied Sciences
Year: 2022
ISSN: 2076-3417
DOI: https://doi.org/10.3390/app12062848